On efficiently solving the subproblems of a level-set method for fused lasso problems
Authors
Abstract
In applying the level-set method developed in [Van den Berg and Friedlander, SIAM J. on Scientific Computing, 31 (2008), pp. 890–912, and SIAM J. on Optimization, 21 (2011), pp. 1201–1229] to solve fused lasso problems, one needs to solve a sequence of regularized least squares subproblems. In order to make the level-set method practical, we develop a highly efficient inexact semismooth Newton based augmented Lagrangian method for solving these subproblems. The efficiency of our approach rests on several ingredients that constitute the main contributions of this paper. First, an explicit formula for constructing the generalized Jacobian of the proximal mapping of the fused lasso regularizer is derived. Second, the special structure of this generalized Jacobian is carefully extracted and analyzed for an efficient implementation of the semismooth Newton method. Finally, numerical results on real data sets, including comparisons between our approach and several state-of-the-art solvers, are presented to demonstrate the high efficiency and robustness of the proposed algorithm in solving challenging large-scale fused lasso problems.
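For reference, the problem and the key proximal-mapping identity can be sketched as follows; the notation (A, b, lambda_1, lambda_2, B) is assumed for this sketch and may differ from the paper's.

```latex
% Fused lasso regularized least squares (notation assumed for this sketch):
%   A is the m x n design matrix, b the response, lambda_1, lambda_2 >= 0.
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\|Ax - b\|_2^2
  \; + \; \underbrace{\lambda_1 \|x\|_1 + \lambda_2 \sum_{i=1}^{n-1} |x_{i+1} - x_i|}_{p(x)}
% Writing p(x) = \lambda_1 \|x\|_1 + \lambda_2 \|Bx\|_1 with B the first-order
% difference matrix, a well-known decomposition (often attributed to
% Friedman et al., 2007) expresses the proximal mapping of p as a composition:
\operatorname{prox}_{p}(v) \;=\;
  \operatorname{prox}_{\lambda_1\|\cdot\|_1}\!\big(\operatorname{prox}_{\lambda_2\|B\cdot\|_1}(v)\big)
```

The generalized Jacobian studied in the paper is a generalized derivative of this composite mapping, which is piecewise affine and hence not differentiable everywhere.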
منابع مشابه
Split Bregman method for large scale fused Lasso
Ordering of regression or classification coefficients occurs in many real-world applications. The fused Lasso exploits this ordering by explicitly regularizing the differences between neighboring coefficients through an ℓ1-norm regularizer. However, due to the nonseparability and nonsmoothness of the regularization term, solving the fused Lasso problem is computationally demanding. Existing s...
A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems
We develop a fast and robust algorithm for solving large scale convex composite optimization models with an emphasis on the ℓ1-regularized least squares regression (Lasso) problems. Despite the fact that there exist a large number of solvers in the literature for the Lasso problems, we found that no solver can efficiently handle difficult large scale regression problems with real data. By lever...
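For context, the proximal mapping of the ℓ1 regularizer at the heart of such Lasso solvers is the componentwise soft-thresholding operator; a minimal NumPy sketch (the function name is ours, not any solver's API):

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal mapping of lam * ||.||_1: shrink each entry of v toward zero by lam."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Example: prox of 0.3 * ||.||_1 applied entrywise to a small vector.
v = np.array([1.0, -0.2, 0.5, -1.4])
print(soft_threshold(v, 0.3))  # approximately [0.7, 0.0, 0.2, -1.1]
```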
Efficient Generalized Fused Lasso with Application to the Diagnosis of Alzheimer's Disease
The Generalized Fused Lasso (GFL) penalizes variables with ℓ1 norms on both the variables and their pairwise differences. GFL is useful when prior information about the data is expressed on a graph. However, existing algorithms for GFL incur high computational cost and do not scale to high dimensionality. In this paper, we propose a fast and scalable algorithm for GFL. Based on the fa...
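In the notation of a quick sketch (a graph G = (V, E) over the variables is assumed here), the GFL penalty described above reads:

```latex
% Generalized fused lasso penalty on a graph G = (V, E) (notation assumed here):
\lambda_1 \sum_{i \in V} |x_i| \; + \; \lambda_2 \sum_{(i,j) \in E} |x_i - x_j|
```

Taking G to be the chain graph over ordered coefficients recovers the standard fused lasso penalty.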
Fast Newton methods for the group fused lasso
We present a new algorithmic approach to the group fused lasso, a convex model that approximates a multi-dimensional signal via an approximately piecewise-constant signal. This model has found many applications in multiple change point detection, signal compression, and total variation denoising, though existing algorithms typically use first-order or alternating minimization schemes. In this...
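A hedged sketch of what such a group fused lasso model can look like, with successive differences coupled through an ℓ2 (group) norm; the symbols y_t, x_t, and lambda are assumptions of this sketch:

```latex
% Group fused lasso approximation of a multi-dimensional signal (notation assumed):
%   observed y_t in R^d for t = 1..T, estimate x_t approximately piecewise constant in t.
\min_{x_1,\dots,x_T \in \mathbb{R}^d} \; \tfrac{1}{2}\sum_{t=1}^{T} \|y_t - x_t\|_2^2
  \; + \; \lambda \sum_{t=1}^{T-1} \|x_{t+1} - x_t\|_2
```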
From safe screening rules to working sets for faster Lasso-type solvers
Convex sparsity-promoting regularizations are ubiquitous in modern statistical learning. By construction, they yield solutions with few non-zero coefficients, which correspond to saturated constraints in the dual optimization formulation. Working set (WS) strategies are generic optimization techniques that consist in solving simpler problems that only consider a subset of constraints, whose ind...
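To illustrate the general idea independently of the screening rules discussed in that work, a minimal working-set loop for a Lasso-type problem might look as follows; all names and the inner ISTA solver are assumptions of this sketch, not the authors' implementation:

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, x0, max_iter=500, tol=1e-8):
    # Plain proximal-gradient (ISTA) solver for 0.5*||Ax - b||^2 + lam*||x||_1.
    x = x0.copy()
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth part
    for _ in range(max_iter):
        x_new = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

def lasso_working_set(A, b, lam, grow=10, max_outer=20, kkt_tol=1e-4):
    # Generic working-set loop: solve the problem restricted to a small set of
    # coordinates, then add those whose dual constraint |A_j^T r| <= lam is violated.
    n = A.shape[1]
    x, ws = np.zeros(n), np.array([], dtype=int)
    for _ in range(max_outer):
        viol = np.maximum(np.abs(A.T @ (b - A @ x)) - lam, 0.0)
        viol[ws] = 0.0                       # coordinates already handled by the inner solver
        if viol.max() <= kkt_tol:
            break                            # all dual constraints (approximately) satisfied
        new = np.argsort(viol)[-grow:]
        ws = np.union1d(ws, new[viol[new] > 0])
        x_ws = lasso_ista(A[:, ws], b, lam, x0=x[ws])
        x = np.zeros(n)
        x[ws] = x_ws
    return x
```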